A time series \(\{X_t\}\) is called weakly stationary if
\(\mu_X(t)\) is independent of \(t\), and
\(\gamma_X(t+h, t)\) is independent of \(t\) for each \(h\).
In other words, the statistical properties of the time series (mean, variance, autocorrelation, etc.) do not depend on the time at which the series is observed; there is no trend or seasonality. However, a time series with cyclic behaviour (but with no trend or seasonality) is stationary.
Strict stationarity of a time series
A time series \(\{X_t\}\) is called strictly stationary if the random vectors \((X_1, X_2, \ldots, X_n)\) and \((X_{1+h}, X_{2+h}, \ldots, X_{n+h})\) have the same joint distribution for all integers \(h\) and all \(n > 0\).
Simple time series models
1. iid noise
no trend or seasonal component
observations are independent and identically distributed (iid) random variables with zero mean.
Notation: \(\{X_t\} \sim IID(0, \sigma^2)\)
plays an important role as a building block for more complicated time series.
If \(\{X_t\}\) is a sequence of uncorrelated random variables, each with zero mean and variance \(\sigma^2\), then such a sequence is referred to as white noise, written \(\{X_t\} \sim WN(0, \sigma^2)\).
Note: Every \(IID(0, \sigma^2)\) sequence is \(WN(0, \sigma^2)\), but not conversely.
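As a quick numerical illustration, the sketch below simulates an \(IID(0, \sigma^2)\) sequence with NumPy and checks that its sample mean and lag-1 autocorrelation are both close to zero, as white noise requires (the seed and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(42)   # seed chosen arbitrarily
sigma, n = 1.0, 10_000

# IID(0, sigma^2) noise: independent draws from one distribution
x = rng.normal(0.0, sigma, size=n)

# For iid (hence white) noise, the sample mean and the lag-1
# sample autocorrelation should both be near zero
mean = x.mean()
lag1_acf = np.corrcoef(x[:-1], x[1:])[0, 1]
```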
Second-order differencing
\[\nabla^2 X_t = \nabla X_t - \nabla X_{t-1}=(X_t-X_{t-1})-(X_{t-1}-X_{t-2})=X_t-2X_{t-1}+X_{t-2}\] In practice, we seldom need to go beyond second-order differencing.
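The second difference can be computed directly with `np.diff(x, n=2)`. A minimal sketch on a purely quadratic series (illustrative data) shows that second-order differencing reduces it to a constant:

```python
import numpy as np

x = np.array([1.0, 4.0, 9.0, 16.0, 25.0, 36.0])  # X_t = t^2, a quadratic trend

d1 = np.diff(x)        # first difference: X_t - X_{t-1}
d2 = np.diff(x, n=2)   # second difference: (X_t - X_{t-1}) - (X_{t-1} - X_{t-2})

# Second-order differencing removes a quadratic trend: d2 is constant (2)
```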
Seasonal differencing
The difference between an observation and the corresponding observation from the previous year.
\[\nabla_mX_t=X_t-X_{t-m}=(1-B^m)X_t\] where \(m\) is the number of seasons. For monthly data, \(m=12\); for quarterly data, \(m=4\).
For monthly series
\[\nabla_{12}X_t=X_t-X_{t-12}\]
Twice-differenced series
\[\nabla\nabla_{12}X_t=\nabla_{12}X_t-\nabla_{12}X_{t-1}=(X_t-X_{t-12})-(X_{t-1}-X_{t-13})\] If seasonality is strong, the seasonal differencing should be done first.
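A sketch of seasonal differencing followed by a first difference, on a synthetic monthly series (the trend, seasonal pattern, and noise level are invented for illustration):

```python
import numpy as np

m = 12                                # monthly data
t = np.arange(48)
rng = np.random.default_rng(0)        # arbitrary seed
# synthetic series: linear trend + seasonal cycle + small noise
x = 0.5 * t + 10 * np.sin(2 * np.pi * t / m) + rng.normal(0, 0.1, size=t.size)

seasonal_diff = x[m:] - x[:-m]        # X_t - X_{t-12}: removes the seasonal cycle
double_diff = np.diff(seasonal_diff)  # a first difference then removes the trend
```

After the seasonal difference the sine component cancels exactly, leaving roughly the constant \(0.5 \times 12 = 6\); the subsequent first difference removes that as well.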
Non-Stationary Time Series
1. Deterministic trend
\[Y_t = f(t) + \epsilon_t\]
where \(\epsilon_t \sim iid(0, \sigma^2)\), \(t = 1, 2, \ldots, T\)
Mean of the process is time dependent, but the variance of the process is constant.
A trend is deterministic if it is a nonrandom function of time.
Non-Stationary Time Series (cont.)
2. Random walk
\[Y_t = Y_{t-1} + \epsilon_t\]
Random walk has a stochastic trend.
This is the model behind the naïve forecasting method.
A trend is said to be stochastic if it is a random function of time.
Non-Stationary Time Series (cont.)
3. Random walk with drift
\[Y_t = \alpha+ Y_{t-1} + \epsilon_t\]
Random walk with drift has both a stochastic trend and a deterministic trend.
Iterating back to the starting value \(Y_0\) gives \(Y_t = Y_0 + t\alpha + \sum_{i=1}^{t} \epsilon_i\): a deterministic trend \(Y_0 + t\alpha\) plus a stochastic trend \(\sum_{i=1}^{t} \epsilon_i\).
Mean: \(E(Y_t) = Y_0 + t\alpha\)
Variance: \(Var(Y_t) = t\sigma^2\).
Both the mean and the variance depend on \(t\), so the process is non-stationary.
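The mean and variance formulas can be checked by simulating many independent paths and looking at the cross-sectional mean and variance at the final time (the drift, horizon, and path count below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)          # arbitrary seed
alpha, sigma, t, n_paths = 0.2, 1.0, 500, 2000
y0 = 0.0

# Simulate n_paths independent paths of Y_t = alpha + Y_{t-1} + eps_t
eps = rng.normal(0.0, sigma, size=(n_paths, t))
y = y0 + alpha * np.arange(1, t + 1) + np.cumsum(eps, axis=1)

# Across paths at the final time: E(Y_t) ≈ Y_0 + t*alpha, Var(Y_t) ≈ t*sigma^2
emp_mean = y[:, -1].mean()   # theoretical value: 0 + 500 * 0.2 = 100
emp_var = y[:, -1].var()     # theoretical value: 500 * 1 = 500
```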
Common trend removal (de-trending) procedures
Deterministic trend: Time-trend regression
The trend can be removed by fitting a deterministic polynomial time trend. The residual series after removing the trend will give us the de-trended series.
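A minimal de-trending sketch using `np.polyfit` on a synthetic series with a linear deterministic trend (the trend coefficients and noise level are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)          # arbitrary seed
t = np.arange(200, dtype=float)
y = 2.0 + 0.3 * t + rng.normal(0.0, 1.0, size=t.size)   # trend + iid noise

coeffs = np.polyfit(t, y, deg=1)        # fit a degree-1 polynomial time trend
trend = np.polyval(coeffs, t)
detrended = y - trend                   # residual (de-trended) series
```

Because the fit includes an intercept, the residuals sum to zero and the de-trended series has no remaining linear trend.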
Stochastic trend: Differencing
A process whose stochastic trend is removed by differencing is also known as a difference-stationary process.
Notation: \(I(d)\)
Integrated of order \(d\): the series can be made stationary by differencing \(d\) times. Such a series is known as an \(I(d)\) process.
Question: Show that the random walk process is an \(I(1)\) process.
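Numerically, differencing a simulated random walk once recovers the underlying noise increments, which illustrates (though does not prove) that the random walk is \(I(1)\); the seed and series length are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(3)        # arbitrary seed
eps = rng.normal(0.0, 1.0, size=5000)
y = np.cumsum(eps)                    # random walk: Y_t = Y_{t-1} + eps_t

d = np.diff(y)                        # first difference: ∇Y_t = eps_t
# d equals the iid noise increments, which form a stationary series
```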
The random walk is called a unit root process: writing it as \(Y_t = \phi Y_{t-1} + \epsilon_t\), the autoregressive root is \(\phi = 1\). (If one of the roots of the characteristic equation equals one, the process is called a unit root process.)
Random walk
```python
import numpy as np
import matplotlib.pyplot as plt

# `samples` was not defined in the original snippet; here it is
# generated as standard normal (iid) noise increments
rng = np.random.default_rng(0)
samples = rng.normal(0, 1, size=1000)

rw = np.cumsum(samples)   # random walk: cumulative sum of the noise
plt.plot(rw)
plt.show()
```